A deep learning functional estimator of optimal dynamics for sampling large deviations
Authors
Abstract
Related Papers
Large Deviations Theory and Empirical Estimator Choice
Criterion choice is a hard problem in information recovery and in estimation and inference. In the case of inverse problems with noise, can probabilistic laws provide a basis for empirical estimator choice? That is the problem we investigate in this paper. Large Deviations Theory is used to evaluate the choice of estimator in two fundamental situations: problems in modelling dat...
Importance Sampling, Large Deviations, and Differential Games
A heuristic that has emerged in the area of importance sampling is that the changes of measure used to prove large deviation lower bounds give good performance when used for importance sampling. Recent work, however, has suggested that the heuristic is incorrect in many situations. The perspective put forth in the present paper is that large deviation theory suggests many changes of measure, an...
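As a hedged illustration of the change-of-measure idea discussed in that abstract (a standard textbook construction, not the authors' scheme), the Python sketch below estimates the Gaussian tail probability P(X > a) with an exponentially tilted proposal and the corresponding likelihood-ratio weights; the threshold a and the sample size are arbitrary illustrative choices:

# Illustrative sketch only: importance sampling for P(X > a), X ~ N(0, 1),
# using the exponentially tilted proposal N(a, 1). Shifting the mean to `a`
# is the change of measure suggested by the Gaussian (Cramer) rate function.
import numpy as np

rng = np.random.default_rng(0)
a = 4.0          # rare-event threshold (arbitrary choice for illustration)
n = 100_000      # Monte Carlo sample size

# Naive Monte Carlo: essentially no samples fall in the rare set {x > a}.
x = rng.standard_normal(n)
naive_est = np.mean(x > a)

# Importance sampling: sample from N(a, 1) and reweight by the likelihood
# ratio dP/dQ(y) = exp(-a*y + a**2 / 2).
y = rng.normal(loc=a, scale=1.0, size=n)
weights = np.exp(-a * y + 0.5 * a**2)
is_est = np.mean((y > a) * weights)

print(f"naive Monte Carlo estimate: {naive_est:.2e}")
print(f"importance sampling estimate: {is_est:.2e}")

With the tilted proposal roughly half of the draws land in the rare set, so the reweighted estimator has far lower relative variance than naive Monte Carlo at the same sample size.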
Functional large deviations for Burgers particle systems
We consider Burgers particle systems, i.e., one-dimensional systems of sticky particles with discrete white noise type initial data (not necessarily Gaussian), and describe functional large deviations for the state of the systems at any given time. For specific functionals such as maximal particle mass, particle speed, rarefaction interval, momentum, energy, etc., the research was initiated by A...
Sample Path Large Deviations and Optimal Importance Sampling for Stochastic Volatility Models
Sample path Large Deviation Principles (LDP) of the Freidlin-Wentzell type are derived for a class of diffusions which govern the price dynamics in common stochastic volatility models from Mathematical Finance. LDP are obtained by relaxing the non-degeneracy requirement on the diffusion matrix in the standard theory of Freidlin and Wentzell. As an application, a sample path LDP is proved for th...
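As a point of reference for the small-noise regime in which such Freidlin-Wentzell results are stated, the sketch below runs an Euler-Maruyama simulation of a toy stochastic volatility diffusion whose noise terms are scaled by sqrt(eps); the Heston-like coefficients, the correlation rho, and the step counts are placeholder assumptions, not the model treated in the cited paper:

# Illustrative sketch only: Euler-Maruyama paths of a toy stochastic
# volatility model with a small-noise parameter eps. As eps decreases,
# paths concentrate around the noiseless dynamics, which is the picture
# a sample-path LDP quantifies. Coefficients below are placeholders.
import numpy as np

def simulate_paths(eps, n_paths=5, n_steps=1_000, T=1.0, rho=-0.7, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, 1.0)    # asset price
    v = np.full(n_paths, 0.04)   # instantaneous variance
    for _ in range(n_steps):
        dw = rng.standard_normal(n_paths) * np.sqrt(dt)
        dz = rho * dw + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths) * np.sqrt(dt)
        vol = np.sqrt(np.maximum(v, 0.0))
        s = s + np.sqrt(eps) * vol * s * dw
        v = v + 1.5 * (0.04 - v) * dt + np.sqrt(eps) * 0.3 * vol * dz
    return s, v

for eps in (1.0, 0.1, 0.01):
    s_T, _ = simulate_paths(eps)
    print(f"eps={eps:5.2f}  terminal prices: {np.round(s_T, 4)}")

Shrinking eps pulls the terminal prices toward the deterministic value 1.0; the rate functional in a sample-path LDP measures the cost of deviating from that limiting path.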
Large deviations for sub-sampling from individual sequences
Consider a sequence of m deterministic points in R^d, and consider the empirical measure of a random sample (without replacement) of size n = n(m). We prove the large deviation principle and compute the resulting rate function for the latter empirical measure under the assumptions that the empirical measure of the m-sequence converges and that n/m tends to a constant strictly between 0 and 1. Surprisingly, the res...
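To make the sampled object concrete, the short sketch below draws a without-replacement subsample of size n = alpha * m from a fixed deterministic sequence and compares its empirical measure with that of the full sequence; the sine-based sequence, the value alpha = 0.3, and the histogram comparison are arbitrary illustrative assumptions, with no rate-function computation attempted:

# Illustrative sketch only: empirical measure of a without-replacement
# subsample from a fixed m-point sequence, the object whose large
# deviations the cited paper studies. Sequence and alpha are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
m = 10_000
alpha = 0.3
points = np.sin(np.arange(m))   # a deterministic sequence of m points in R

n = int(alpha * m)
subsample = rng.choice(points, size=n, replace=False)

# Compare the two empirical measures via histograms on a common grid.
bins = np.linspace(-1.0, 1.0, 21)
full_density, _ = np.histogram(points, bins=bins, density=True)
sub_density, _ = np.histogram(subsample, bins=bins, density=True)
print("max gap between empirical densities:",
      round(float(np.max(np.abs(full_density - sub_density))), 3))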
Journal
Journal title: Machine Learning: Science and Technology
Year: 2020
ISSN: 2632-2153
DOI: 10.1088/2632-2153/ab95a1